Backpropagation without Multiplication

Authors

  • Patrice Y. Simard
  • Hans Peter Graf

AT&T Bell Laboratories, Holmdel, NJ 07733
Abstract

The backpropagation algorithm has been modified to work without any multiplications and to tolerate computations with a low resolution, which makes it more attractive for a hardware implementation. Numbers are represented in floating-point format with 1 bit mantissa and 3 bits in the exponent for the states, and 1 bit mantissa and 5 bit exponent for the gradients, while the weights are 16 bit fixed-point numbers. In this way, all the computations can be executed with shift and add operations. Large networks with over 100,000 weights were trained and demonstrated the same performance as networks computed with full precision. An estimate of a circuit implementation shows that a large network can be placed on a single chip, reaching more than 1 billion weight updates per second. A speedup is also obtained on any machine where a multiplication is slower than a shift operation.
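
Because a state carries only a sign bit for its mantissa, a weight-state product reduces to an arithmetic shift of the 16-bit weight by the state's exponent. The following C sketch illustrates that idea; the type and function names are assumptions for illustration, not the authors' code:

    #include <stdint.h>
    #include <stdio.h>

    /* A state with a 1-bit mantissa (sign) and 3-bit exponent encodes
       x = sign * 2^(-exp). Multiplying it with a 16-bit fixed-point
       weight then costs one shift and one possible negation.          */
    typedef struct {
        int sign;          /* +1 or -1: the 1-bit mantissa */
        unsigned exp;      /* 0..7: the 3-bit exponent     */
    } qstate;

    static int32_t mul_free(int16_t weight, qstate x)
    {
        /* An arithmetic right shift replaces the multiplication
           (assumes the usual arithmetic-shift behavior for negative
           weights, which holds on common platforms).                 */
        int32_t shifted = (int32_t)weight >> x.exp;
        return x.sign >= 0 ? shifted : -shifted;
    }

    int main(void)
    {
        qstate x = { -1, 2 };            /* x = -0.25          */
        int16_t w = 12000;               /* fixed-point weight */
        printf("%d\n", mul_free(w, x));  /* prints -3000       */
        return 0;
    }

Gradients work the same way with their 5-bit exponent, so the backward pass likewise reduces to shifts and adds.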

Related articles

Hardware Implementation of the Backpropagation without Multiplication

The backpropagation algorithm has been modified to work without any multiplications and to tolerate computations with a low resolution, which makes it more attractive for a hardware implementation. Numbers are represented in floating-point format with 1 bit mantissa and 2 bits in the exponent for the states, and 1 bit mantissa and 4 bit exponent for the gradients, while the weights are 16 bit fixed...
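
Both this variant and the paper above rely on rounding every state and gradient to the nearest power of two, so that the 1-bit mantissa loses as little information as possible. A hedged C sketch of such a quantizer (the function name, rounding rule, and the assumption that states lie in (-1, 1) are mine, not the papers'):

    #include <math.h>
    #include <stdio.h>

    /* Round v to sign * 2^e, clamping e to what an exp_bits-wide
       exponent can encode. Compile with -lm.                     */
    double quantize_pow2(double v, int exp_bits)
    {
        if (v == 0.0) return 0.0;
        int e = (int)lround(log2(fabs(v)));  /* nearest power of two */
        int e_min = -((1 << exp_bits) - 1);  /* smallest exponent    */
        if (e > 0)     e = 0;                /* assume |v| < 1       */
        if (e < e_min) e = e_min;
        return copysign(ldexp(1.0, e), v);
    }

    int main(void)
    {
        printf("%g\n", quantize_pow2(0.3, 2));   /* -> 0.25 */
        printf("%g\n", quantize_pow2(-0.7, 2));  /* -> -0.5 */
        return 0;
    }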

A Differentiable Transition Between Additive and Multiplicative Neurons

Existing approaches to combine both additive and multiplicative neural units either use a fixed assignment of operations or require discrete optimization to determine what function a neuron should perform. However, this leads to an extensive increase in the computational complexity of the training procedure. We present a novel, parameterizable transfer function based on the mathematical concept...

A Neural Transfer Function for a Smooth and Differentiable Transition Between Additive and Multiplicative Interactions

Existing approaches to combine both additive and multiplicative neural units either use a fixed assignment of operations or require discrete optimization to determine what function a neuron should perform. This leads either to an inefficient distribution of computational resources or an extensive increase in the computational complexity of the training procedure. We present a novel, parameteriz...
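
The two entries above describe the same line of work. As a purely illustrative sketch of the idea (this linear blend is my assumption, not the papers' actual transfer function), a unit can interpolate between an additive and a multiplicative response through a single differentiable parameter:

    #include <math.h>
    #include <stdio.h>

    /* a = 0 gives a conventional weighted sum; a = 1 gives a
       multiplicative (product) unit; intermediate a blends the two.
       Because the blend is differentiable in a, gradient descent can
       pick each unit's operation during training. Compile with -lm.  */
    double blended_unit(const double *x, const double *w, int n, double a)
    {
        double sum = 0.0, prod = 1.0;
        for (int i = 0; i < n; i++) {
            sum  += w[i] * x[i];      /* additive interaction       */
            prod *= pow(x[i], w[i]);  /* multiplicative interaction */
        }
        return (1.0 - a) * sum + a * prod;
    }

    int main(void)
    {
        double x[] = { 2.0, 4.0 }, w[] = { 0.5, 0.5 };
        printf("%g\n", blended_unit(x, w, 2, 0.0));  /* sum:     3       */
        printf("%g\n", blended_unit(x, w, 2, 1.0));  /* product: 2.82843 */
        return 0;
    }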

A Parallelized Matrix-Multiplication Implementation of Neural Network for Collision Free Robot Path Planning

This paper covers the problem of Collision Free Path Planning for an autonomous robot and proposes a solution to it through the Back Propagation Neural Network. The solution is transformed into a composition of matrix-multiplication operations, which is a classic example of problems that can be efficiently parallelized. This paper finally proposes a parallel implementation of this matrix-multip...
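
The parallelization opportunity comes from the rows of each weight matrix being independent. A minimal sketch (OpenMP is my choice here; the paper's own parallel framework may differ) of a row-parallel forward pass y = W x:

    #include <stdio.h>

    /* Each thread handles a disjoint set of rows, so no synchronization
       is needed. Compile with -fopenmp; without it the pragma is simply
       ignored and the code runs serially.                               */
    void matvec(const double *W, const double *x, double *y,
                int rows, int cols)
    {
        #pragma omp parallel for
        for (int r = 0; r < rows; r++) {
            double acc = 0.0;
            for (int c = 0; c < cols; c++)
                acc += W[r * cols + c] * x[c];
            y[r] = acc;
        }
    }

    int main(void)
    {
        double W[] = { 1, 2, 3, 4 };    /* 2 x 2, row-major */
        double x[] = { 1, 1 }, y[2];
        matvec(W, x, y, 2, 2);
        printf("%g %g\n", y[0], y[1]);  /* 3 7 */
        return 0;
    }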

Learning in the Machine: Random Backpropagation and the Learning Channel

Abstract: Random backpropagation (RBP) is a variant of the backpropagation algorithm for training neural networks, where the transposes of the forward matrices are replaced by fixed random matrices in the calculation of the weight updates. It is remarkable both because of its effectiveness, in spite of using random matrices to communicate error information, and because it completely removes the ...
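
In standard backpropagation the hidden-layer error is the backpropagated output error multiplied elementwise by the activation derivative; RBP replaces the transposed forward weights in that product with a fixed random matrix that never changes during training. A hedged C sketch of that single substitution (layout and names are illustrative):

    #include <stdio.h>

    /* B is an n_hidden x n_out matrix drawn randomly once and then
       frozen; it stands in for the transpose of the forward weights. */
    void rbp_hidden_delta(const double *B, const double *delta_o,
                          const double *fprime_h, double *delta_h,
                          int n_hidden, int n_out)
    {
        for (int i = 0; i < n_hidden; i++) {
            double acc = 0.0;
            for (int j = 0; j < n_out; j++)
                acc += B[i * n_out + j] * delta_o[j];
            delta_h[i] = acc * fprime_h[i];  /* random feedback, not W^T */
        }
    }

    int main(void)
    {
        double B[] = { 0.1, -0.2, 0.3, 0.05 };  /* 2 x 2, fixed */
        double delta_o[]  = { 1.0, 2.0 };
        double fprime_h[] = { 0.5, 1.0 };
        double delta_h[2];
        rbp_hidden_delta(B, delta_o, fprime_h, delta_h, 2, 2);
        printf("%g %g\n", delta_h[0], delta_h[1]);  /* -0.15 0.4 */
        return 0;
    }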

Publication date: 1993